Learning the kernel matrix by maximizing a KFD-based class separability criterion

Authors

  • Dit-Yan Yeung
  • Hong Chang
  • Guang Dai
Abstract

The advantage of a kernel method often depends critically on a proper choice of the kernel function. A promising approach is to learn the kernel from data automatically. In this paper, we propose a novel method for learning the kernel matrix based on maximizing a class separability criterion similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD). Notably, optimizing this criterion does not require inverting the possibly singular within-class scatter matrix, a computational problem encountered by many LDA and KFD methods. We have conducted experiments on both synthetic data and real-world data from UCI and FERET, showing that our method consistently outperforms some previous kernel learning methods.
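
This page does not reproduce the criterion itself, but a trace-based separability measure of the kind the abstract describes can be evaluated directly from the kernel matrix K and the class labels. The following is a minimal sketch, not the paper's exact formulation: the helper name trace_separability and the particular ratio tr(S_B)/tr(S_W) are assumptions. It illustrates the key computational point, that both traces reduce to block sums of K, so the within-class scatter matrix is never formed, let alone inverted.

    import numpy as np

    def trace_separability(K, y):
        """Trace-ratio class separability J = tr(S_B) / tr(S_W), evaluated
        directly from an n x n kernel matrix K and integer labels y.

        Uses the feature-space identities
            tr(S_T) = sum_i K_ii - (1/n) * sum_{i,j} K_ij
            tr(S_W) = sum_c [ sum_{i in c} K_ii - (1/n_c) * sum_{i,j in c} K_ij ]
            tr(S_B) = tr(S_T) - tr(S_W)
        so no scatter matrix is formed or inverted.
        """
        n = K.shape[0]
        y = np.asarray(y)
        tr_total = np.trace(K) - K.sum() / n
        tr_within = 0.0
        for c in np.unique(y):
            idx = np.flatnonzero(y == c)
            Kc = K[np.ix_(idx, idx)]
            tr_within += np.trace(Kc) - Kc.sum() / idx.size
        return (tr_total - tr_within) / tr_within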

Related articles

Adaptive Quasiconformal Kernel Fisher Discriminant Analysis via Weighted Maximum Margin Criterion

Kernel Fisher discriminant analysis (KFD) is an effective method for extracting nonlinear discriminant features from input data using the kernel trick. However, conventional KFD algorithms suffer from the kernel selection problem as well as the singularity problem. In order to overcome these limitations, a novel nonlinear feature extraction method called adaptive quasiconformal kernel Fisher discriminant ana...

Feature Selection of Support Vector Domain Description Using Gaussian Kernel

The performance of kernel-based learning algorithms, such as support vector domain description, depends heavily on the proper choice of the kernel parameter. It is desirable for kernel machines to work with the optimal kernel parameter, one that adapts well to the input data and the pattern classification task. In this paper we present a novel algorithm to optimize the Gaussian kernel paramet...
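
As context for this and the remaining abstracts: the Gaussian kernel has a single width parameter, and that width is exactly what these methods tune. A standard formulation (not specific to this paper) is sketched below. Note the two degenerate extremes: as sigma shrinks toward zero the kernel matrix approaches the identity, and as it grows the matrix approaches all-ones, so class separability collapses at both ends.

    import numpy as np

    def gaussian_kernel(X, Z, sigma):
        # K[i, j] = exp(-||x_i - z_j||^2 / (2 * sigma^2))
        sq_dists = ((X ** 2).sum(axis=1)[:, None]
                    + (Z ** 2).sum(axis=1)[None, :]
                    - 2.0 * X @ Z.T)
        return np.exp(-np.maximum(sq_dists, 0.0) / (2.0 * sigma ** 2))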

Kernel second-order discriminants versus support vector machines

Support vector machines (SVMs) are the best-known nonlinear classifiers based on the Mercer kernel trick. They generally lead to very sparse solutions that ensure good generalization performance. Recently, Mika et al. have proposed a new nonlinear technique based on the kernel trick and the Fisher criterion: the nonlinear kernel Fisher discriminant (KFD). Experiments show that KFD is compe...

Learning Kernel Parameters by using Class Separability Measure

Learning kernel parameters is important for kernel-based methods because these parameters have a significant impact on the generalization ability of such methods. Besides cross-validation and leave-one-out estimation, minimizing upper bounds on the generalization error, such as the radius-margin bound, has also been proposed as a more efficient way to learn the optimal kernel parameters. In this ...
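
Scoring candidate kernel widths with a class separability measure is cheap compared with cross-validation: one criterion evaluation per candidate, and no classifier training. A minimal sketch of this selection loop, reusing the hypothetical gaussian_kernel and trace_separability helpers from the sketches above:

    import numpy as np

    def select_sigma_by_separability(X, y, sigmas):
        # Score each candidate width by the separability of the resulting
        # kernel matrix; pick the best. No classifier is trained.
        scores = [trace_separability(gaussian_kernel(X, X, s), y)
                  for s in sigmas]
        return sigmas[int(np.argmax(scores))]

    # Example: best = select_sigma_by_separability(X, y, np.logspace(-2, 2, 25))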

Gaussian kernel optimization for pattern classification

This paper presents a novel algorithm to optimize the Gaussian kernel for pattern classification tasks, where it is desirable to have well-separated samples in the kernel feature space. We propose to optimize the Gaussian kernel parameters by maximizing a classical class separability criterion, and the problem is solved through a quasi-Newton algorithm by making use of a recently proposed decom...
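
Once the criterion is a smooth function of the kernel parameter, a quasi-Newton solver of the kind this abstract mentions takes only a few lines to set up. The sketch below is an illustration under stated assumptions, not the paper's algorithm: it uses SciPy's L-BFGS-B with finite-difference gradients (the decomposition-based gradient the abstract alludes to is not reproduced here) and optimizes log(sigma) to keep the width positive. gaussian_kernel and trace_separability are the hypothetical helpers from the earlier sketches.

    import numpy as np
    from scipy.optimize import minimize

    def optimize_sigma(X, y, sigma0=1.0):
        # Minimize the negated separability criterion over log(sigma);
        # L-BFGS-B falls back to numerical gradients when jac is omitted.
        def neg_criterion(log_sigma):
            K = gaussian_kernel(X, X, np.exp(log_sigma[0]))
            return -trace_separability(K, y)

        result = minimize(neg_criterion, x0=[np.log(sigma0)],
                          method="L-BFGS-B")
        return float(np.exp(result.x[0]))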

Journal:
  • Pattern Recognition

Volume 40, Issue -

Pages -

Publication date: 2007